Personal augmented reality
Patent Abstract:
A method, article of manufacture and system for receiving, from a user device located at an environment at which a user is viewing an event both with and without the use of the user device, image data containing an image of the environment; receiving a request from the user device for an item of information that relates to the event, for placement into the image of the environment; responsive to receiving the request, accessing a database to retrieve the item of information; generating a scaled image of the item of information based on dimensions of the environment; and transmitting the scaled image for placement into the image of the environment on the user device to generate an augmented reality image.
Publication No.: AU2012362467A1 Application No.: U2012362467 Filing Date: 2012-12-27 Publication Date: 2014-06-12 Inventor: Matthew Scott Zises Applicant: eBay Inc; Primary IPC: G06K9-62
Patent Description:
WO 2013/101903 PCT/US2012/071770
PERSONAL AUGMENTED REALITY

RELATED APPLICATION
[0001] This application is an International Application which claims priority to U.S. Non-Provisional Application No. 13/340,141, filed December 29, 2011, which application is incorporated herein by reference in its entirety.

TECHNICAL FIELD
[0002] This application relates to a method and system for providing augmented reality for the environment in which a user is watching an event live in a stadium.

BACKGROUND
[0003] An individual viewing a real-time event in person may desire information related to the real-time event, the people engaged in the real-time event, the locale of the real-time event, and even other viewers at the locale of the real-time event. Further, the viewing individual may desire that information in such a way as to augment his or her viewing of the event. Conventionally, the only such information available to the viewer is merely a fraction of the desired information, for example that information which entities controlling the real-time event may elect to provide at a scoreboard or other display at the real-time event. Even this information does not actually augment the real-time event other than by being viewable in a general, and not personal (such as personal choice), way. Consequently, the individual viewer has no, or only a very limited, way of obtaining the desired information presented in a way that is most beneficial to that individual. In addition, sales organizations desire to target advertisements to the viewing individual at the locale, based on characteristics or location of the individual. Often, such organizations have only very limited ways of targeting such advertisements.

BRIEF DESCRIPTION OF THE DRAWINGS
[0004] The present disclosure is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which:
[0005] FIG. 1 is a network diagram depicting a network system, according to one embodiment, having a client-server architecture configured for exchanging data over a network;
[0006] FIG. 2 is a block diagram illustrating an example embodiment of a location-based incentive application;
[0007] FIG. 3 is a block diagram illustrating an example embodiment of a location identification module;
[0008] FIG. 4 is a block diagram illustrating an example embodiment of an item identification module;
[0009] FIG. 5 is a block diagram illustrating an example embodiment of an incentive module;
[0010] FIG. 6 is a flow chart illustrating a method for augmenting personal, real-time, viewing of an event by a user at the location of the event;
[0011] FIG. 7A is an illustration of a ball game at a ballpark;
[0012] FIG. 7B is an illustration of the ball game of FIG. 7A with statistics of the player on second base, as requested by the user;
[0013] FIG. 8 is an illustration of seating and other locations in a ballpark;
[0014] FIG. 9A is an illustration of one embodiment of personal augmentation for a user;
[0015] FIG. 9B is an illustration of augmenting the personal view of a user with the personal augmentation of FIG. 9A; and
[0016] FIG. 10 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions may be executed to cause the machine to perform any one or more of the methodologies discussed herein.
DETAILED DESCRIPTION
[0017] Although the present disclosure has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader scope of the disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
[0018] Example embodiments described herein provide systems and methods for augmenting one's experience at a real-time in-person attendance of an event such as, in one embodiment, attending a baseball game at a ballpark. This may be accomplished by concurrently viewing the event at the ballpark using a camera such as on an iPhone, or a camera embodied in a mobile device such as an iPad. Such devices may herein be termed a "client device" or a "user device" or an iPhone or an iPad without limiting the type of device used. In example embodiments, environment image data containing an image of an environment is received by a system from a client device operated by a user, in one embodiment, viewing the ball game at the ballpark.
[0019] The user may wish to use the user device to watch the game with statistics the user is most interested in, such as statistics for one team, for one particular player, for one position, and the like. The user may also desire to have an embodiment which may be used to determine specific attributes of the ballpark, such as the type and location of a restaurant, handicapped bathrooms, an exit closest to where the user parked, and the like. The user may send the system a request for the desired information to augment the reality of the ball game being watched at the ballpark.
[0020] Likewise, marketing organizations, whether online ecommerce organizations or conventional "brick and mortar" organizations, may desire an embodiment that allows them to target ads to the user. These ads may be for items that are based on the user's purchase history, interests, or on the location of an organization closest to the ballpark that sells such items. Given different histories and interests of people, two people could be watching the same game at the same ballpark but have different ads targeted to them. Using appropriate image scaling, a selected statistic or targeted ad may be placed into an indicated location of the environment as an augmentation. As used herein, the statistic or targeted ad may be referred to as an "item" for ease of reference, and without losing generality. An image of the selected item is scaled to a scale that is based on dimensions determined from the environment image data for the environment. The dimensions may be determined based on a calculated distance to a focal point of the indicated location in the environment and/or on a marker located in the image of the environment. The scaled item image is augmented into the image of the environment at the indicated location to generate an augmented reality image. In some embodiments, the scaled item may be oriented to match an orientation of the indicated location in the environment. Technology useful in implementing augmented reality may be seen in U.S. Patent Application Serial Number 13/283,416, filed October 27, 2011 and entitled "SYSTEM AND METHOD FOR VISUALIZATION OF ITEMS IN AN ENVIRONMENT USING AUGMENTED REALITY," particularly FIGS. 1-5 thereof, and the text associated therewith. The foregoing patent application is owned by the owner of the present patent and is hereby incorporated by reference herein in its entirety.
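To make the scaling step concrete, the following is a minimal sketch assuming a marker of known physical size (such as a ballpark section sign) is visible in the environment image; the function names and the simple pixels-per-meter model are illustrative assumptions, not the implementation prescribed by the disclosure or by the incorporated application.

```python
# Illustrative sketch: scale an overlay item using a marker of known physical size.
# The names and the linear pixels-per-meter model are assumptions for illustration.

def pixels_per_meter(marker_pixel_width: float, marker_real_width_m: float) -> float:
    """Estimate image scale from a marker (e.g., a section sign) whose real-world
    width is known and whose width in the captured environment image is measured."""
    return marker_pixel_width / marker_real_width_m

def scaled_overlay_size(item_real_width_m: float, item_aspect: float, ppm: float):
    """Return (width_px, height_px) for the item image so it appears at a plausible
    size relative to the environment image."""
    width_px = int(item_real_width_m * ppm)
    height_px = int(width_px / item_aspect)
    return width_px, height_px

# Example: a section sign 0.60 m wide spans 120 px, giving 200 px per meter;
# a statistics panel meant to appear 1.5 m wide is rendered 300 px wide.
ppm = pixels_per_meter(120.0, 0.60)
print(scaled_overlay_size(1.5, 4.0, ppm))   # -> (300, 75)
```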
[0021] By using embodiments of the present disclosure, a user may search for an item and augment an image of an environment with an image of the item that is a personal choice of the user. Because the user may create and view an augmented reality image of the environment including the selected item, the user easily views the selected item in the environment. Therefore, one or more of the methodologies discussed herein may obviate a need for time-consuming data processing by the user in order to obtain requested statistics and the like. This may have the technical effect of reducing computing resources used by one or more devices within the system. Examples of such computing resources include, without limitation, processor cycles, network traffic, memory usage, storage space, and power consumption.
[0022] Embodiments may be implemented using wireless or other suitable technology. As discussed, the user may be watching a live baseball game in a ballpark. The system may have stored in its database relevant information about ballparks of interest, the information including the location of various points within each ballpark, perhaps by ballpark coordinates, or by other appropriate locating markers such as ballpark section numbers and seat numbers. The system may also have stored in its database the players of relevant teams (e.g., teams in the league in which the current team plays), by number, position, or other identification, and their statistics. In the discussion that follows, queries or requests that may be made to the system by the user may be implemented as selection of predetermined queries available on a user interface ("UI") on a user device such as a UI of an iPad or an iPhone on which the user is watching the game live in the ballpark. For example, the user may select predetermined requests on the UI. This may also be accomplished, for example, by the user employing a UI on the user device to send a non-predetermined request (e.g., a request which is of the user's personal choice) to the system for statistics for the player being viewed, whose identity may also be sent to the system by the user by way of the UI, or may be otherwise recognized by the system by the technologies discussed more fully below. Similarly, the system database may have stored the team pitching rotation for starters and relievers for this and, perhaps, all relevant teams, and any players that are currently on the injured reserve list for each team. The database may also store the current league standings and the possible playoff scenarios based on the current won/lost record for each team, and the possibilities of wins and losses for teams leading up to the playoffs.
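As a rough illustration of such a request, the sketch below resolves a predetermined player-statistics query against a stored statistics table; the request fields, the table layout, and the statistic values are placeholder assumptions rather than part of the disclosure.

```python
# Illustrative sketch: resolve a predetermined "player statistics" request against
# a stored statistics table. All values below are placeholders.

PLAYER_STATS = {
    ("Red Sox", 34): {"player": "David Ortiz", "avg": 0.300, "hr": 30, "rbi": 100},  # placeholder values
}

def handle_stats_request(request: dict) -> dict:
    """Look up the statistics named in a UI request and return the item of
    information to be scaled and placed into the environment image."""
    stats = PLAYER_STATS.get((request["team"], request["number"]))
    if stats is None:
        return {"status": "not_found"}
    return {"status": "ok", "item": stats}

print(handle_stats_request({"type": "player_stats", "team": "Red Sox", "number": 34}))
```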
[0023] In one embodiment, the user may use the camera of a user device such as an iPhone or an iPad to provide context, such as the baseball game the user is watching in a ballpark, as mentioned above, along with requested items of information. In response, the system and the display of the user device may show items popping up to augment the reality of the game that is being watched. While the user is looking at the field using an iPad, he or she may aim the camera of the iPad for a close-up of a particular player. The player's image may be rendered on the display of the iPad during the real-time baseball game, and player statistics for that player may also be rendered on the display by the system, responsive to the user's request for the statistics. This may be accomplished, for example, by the user employing a UI on the user device to send a request to the system for statistics for the player being viewed, whose identity may be sent to the system by the user by way of the UI, or otherwise recognized by the system as more fully discussed below.
[0024] In another embodiment, the system may include a video recognition system that recognizes images of the players, for example, by the player's team uniform number, to enable the system to correlate team, uniform number, and player in order to transmit the correct statistics for viewing if so requested. The system may, if desired, point to the particular player by arrow or other indicator when augmenting the reality of in-person watching of the game live, by presenting the player's statistics on the user device.
[0025] In another embodiment, the camera of the iPad may point from player to player and the system may recognize the players, by number recognition, or by other recognition technology such as facial recognition algorithms, or by RFID identification from an RFID chip embedded in the player's uniform. The user's favorite players may be found by matching an earlier prepared list of the user's favorite players located in an appropriate database, including the players' numbers or other identification, with the player identity that is so recognizable by the system as the camera pans across the field. The recognition may be made by using the technology discussed above, the numbers being compared to those of the user's favorite players by database search as the user device scans a player. The system may then transmit the requested player statistics for rendering on the iPad display along with, for example, arrows pointing to the various favorite players on the screen of the user device as the camera pans across the field of play. Similarly, the ball itself may contain an RFID chip, and the above technology may be used to augment the reality of the ball game on the iPad by using the detected path of the RFID chip within the ball to illustrate the trajectory of the ball.
[0026] Based on the above implementations, the user may request the statistics of a particular player; the statistics of his or her favorite player; how a player compares in various statistic areas with other players at the same position; how the particular player compares in batting average; or how the player compares in other statistics categories with other players in the league.
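A matching step of the kind just described might, purely for illustration, look like the following sketch; the recognition output format, the teams, and the uniform numbers are assumed placeholders.

```python
# Illustrative sketch: match players recognized in the camera view against a stored
# list of the user's favorite players so their statistics and pointer arrows can be sent.

FAVORITES = {("Red Sox", 34), ("Red Sox", 15)}   # (team, uniform number) pairs, placeholder

def favorite_players_in_view(recognized: list) -> list:
    """Given players recognized in the current frame (e.g., by uniform-number,
    facial, or RFID recognition), return those matching the user's favorites."""
    return [p for p in recognized if (p["team"], p["number"]) in FAVORITES]

frame = [
    {"team": "Red Sox", "number": 34, "bbox": (410, 220, 60, 140)},
    {"team": "Yankees", "number": 2, "bbox": (650, 230, 55, 135)},
]
print(favorite_players_in_view(frame))   # only the favorite player is returned
```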
[0027] Another embodiment may show the user, for example by arrows, where friends of the user are seated in the stands of the ballpark. This may be implemented, as one example, by the user sending the system a query for the seat location of a friend. Responsive to this query, the system may read the friend's phone number from the user's cell phone contact list and, in conjunction with a global positioning service ("GPS") satellite system, determine the seat location, generally or specifically, of the cell phone that has the friend's phone number. The system may then, from the above stored ballpark coordinates, respond to the user with information as to the friend's seat location, including, if desired, placing an arrow on the display of the user device pointing to the location of the friend's seat, augmenting the viewer's personal reality of viewing the ballpark live. In an alternate embodiment, instead of GPS technology, the system may have stored in its database the names of people who purchased the various seats in the ballpark, and may then, upon request for the seating of a particular person, search for the name of that person to find the person's seat location. Again, if desired, an arrow pointing to the seat location may be placed upon the image of the user device in order to augment the reality of the live viewing of the game. In yet another embodiment, the friend may have posted on a social network such as Facebook the section, row, and seat number where that friend will be sitting. The user may then read this seating off the friend's Facebook wall and send that information to the system, for use by the system in augmenting the user device by, as one example, pointing on the device's display to the point in the ballpark where that seating is located. In still another embodiment, directions may be provided based on the user's preferences sent to the system with or as part of the request. For example, if the user requires elevators, the system will point the user in a direction that includes elevators; if the user needs a ramp, the system points in a direction that includes ramps. Otherwise the system may be free to point out directions using stairs.
[0028] In another embodiment, the iPhone or iPad may function as a "set of eyeglasses," providing the user with information relevant to the user's personal choices as the user moves the user device from place to place within the ballpark. For example, as the user walks out of the user's seating section towards the ballpark concourse, the user device may, responsive to a user query, show the location of ballpark locations that the user may be interested in, for example the closest bathroom, or a bathroom that is outfitted for the handicapped. This could be done by the camera of the user device transmitting to the system images of various locations of the ballpark, such as a marker showing a ballpark section number. As mentioned previously, the system may have previously stored in a database an image of the ballpark, with coordinates for various areas of the ballpark, perhaps keyed to various ballpark markers. The database may be searched using the marker transmitted from the camera, and the desired location may be rendered on the display of the user device along with the pointing information discussed above.
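One minimal sketch of this marker-based lookup, with invented coordinates and facility entries standing in for the stored ballpark data, follows.

```python
# Illustrative sketch: resolve a section marker seen by the camera to stored ballpark
# coordinates and compute a pointing direction toward a requested facility.
import math

SECTION_COORDS = {"Section 114": (120.0, 45.0), "Section 115": (135.0, 45.0)}   # placeholder coordinates
FACILITIES = {"accessible_bathroom": (150.0, 60.0), "standard_bathroom": (110.0, 60.0)}

def direction_to_facility(marker: str, facility: str) -> float:
    """Return a bearing in degrees from the marker's stored location to the requested
    facility, suitable for rendering as an arrow on the device display."""
    ux, uy = SECTION_COORDS[marker]
    fx, fy = FACILITIES[facility]
    return math.degrees(math.atan2(fy - uy, fx - ux))

print(direction_to_facility("Section 114", "accessible_bathroom"))  # ~26.6 degrees
```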
[0029] Other embodiments may include the user requesting the system to display the shortest path out of the stadium to where the user parked his or her car. The system may receive a feed of road traffic, and the user could request the parking lot exit from which to leave in order to encounter the least traffic, the fastest way home, the route that has the least traffic, the route that passes a particular restaurant, or the route that passes several restaurants on the way home, and the system could respond with that information for display on the user device.
[0030] In any of the above embodiments, as desired, the system may also access the user's purchasing history if, for example, the system is, or is associated with, an ecommerce system or other type of marketing system. The system may then, along with or separate from the above augmented reality information, transmit advertisements to the user based on that purchasing history. In addition, the advertisements may be based on the location of the ballpark and indicate the location of a store near the ballpark that sells the subject matter of the advertisement. The system may, when sending statistics that were requested for a player, also include an advertisement showing where the jersey or other souvenirs of the player can be purchased, or where team memorabilia, such as championship pennant replicas, may be purchased. Since the purchasing history is personal to the user, different users using the above methods would view the same game in the same ballpark, but would receive different responses to the same query from the system.
[0031] FIG. 1 is a network diagram depicting a network system 100, according to one embodiment, having a client-server architecture configured for exchanging data over a network. For example, the network system 100 may be a publication/publisher system 102 where clients may communicate and exchange data within the network system 100. The data may pertain to various functions (e.g., online item purchases) and aspects (e.g., managing content and user reputation values) associated with the network system 100 and its users. Although illustrated herein as a client-server architecture as an example, other embodiments may include other network architectures, such as a peer-to-peer or distributed network environment.
[0032] A data exchange platform, in an example form of a network-based publisher 102, may provide server-side functionality, via a network 104 (e.g., the Internet), to one or more clients. The one or more clients may include users that utilize the network system 100 and, more specifically, the network-based publisher 102, to exchange data over the network 104. These transactions may include transmitting, receiving (communicating) and processing data to, from, and regarding content and users of the network system 100. The data may include, but are not limited to, content and user data such as feedback data; user reputation values; user profiles; user attributes; product and service reviews and information, such as pricing and descriptive information; product, service, manufacturer, and vendor recommendations and identifiers; product and service listings associated with buyers and sellers; auction bids; and transaction data, among other things.
[0033] In various embodiments, the data exchanges within the network system 100 may be dependent upon user-selected functions available through one or more client or user interfaces (UIs). The UIs may be associated with a client machine, such as a client machine 106 using a web client 110. The web client 110 may be in communication with the network-based publisher 102 via a web server 120.
The UIs may also be associated with a client machine 108 using a programmatic client 112, such as a client application, or a mobile device 132 hosting a third party application 116. It may be appreciated that in various embodiments the client machine 106, 108, or third party server 114 may be associated with a buyer, a seller, a third party electronic commerce platform, a payment service provider, or a shipping service provider, each in communication with the network-based publisher 102 and optionally each other. The buyers and sellers may be any one of individuals, merchants, or service providers, among other things.
[0034] A mobile device 132, such as an iPad or an iPhone, as non-limiting examples, may also be in communication with the network-based publisher 102 via a web server 120. The mobile device 132 may include a portable electronic device providing at least some of the functionalities of the client machines 106 and 108. The mobile device 132 may include a third party application 116 (or a web client) configured to communicate with the application server 122.
[0035] Turning specifically to the network-based publisher 102, an application program interface (API) server 118 and a web server 120 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 122. The application servers 122 may host one or more publication application(s) 124 and a location-based incentive application 130. The application servers 122 are, in turn, shown to be coupled to one or more database server(s) 126 that facilitate access to one or more database(s) 128.
[0036] In one embodiment, the web server 120 and the API server 118 communicate and receive data pertaining to statistics and feedback, among other things, via various user input tools. For example, the web server 120 may send and receive data to and from a toolbar or webpage on a browser application (e.g., web client 110) operating on a client machine (e.g., client machine 106). The API server 118 may send and receive data to and from an application (e.g., client application 112 or third party application 116) running on other client machines (e.g., client machine 108 or mobile device 132 which, as previously indicated, may be an iPhone or an iPad).
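To illustrate how a client application might exchange such data with the API server 118, the sketch below posts the environment image data and an item request and reads back the response; the endpoint path, payload fields, and encoding are assumptions and not a documented interface of the network-based publisher 102.

```python
# Illustrative sketch of a client-side exchange: post image data plus an item request,
# receive the scaled item and placement data. The endpoint and payload are assumed.
import json
import urllib.request

def send_augmentation_request(api_base: str, image_bytes: bytes, request: dict) -> dict:
    """POST the environment image data and the item-of-information request, then
    return the server's response (e.g., the scaled item image and placement data)."""
    payload = json.dumps({
        "image_data": image_bytes.hex(),   # simplified encoding for illustration
        "request": request,
    }).encode("utf-8")
    req = urllib.request.Request(
        f"{api_base}/augmentation",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Hypothetical usage (no real endpoint is implied by the disclosure):
# send_augmentation_request("https://api.example.com", jpeg_bytes,
#                           {"type": "player_stats", "team": "Red Sox", "number": 34})
```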
[0037] A publication application(s) 124 may provide a number of publisher functions and services (e.g., listing, payment, etc.) to users that access the network-based publisher 102. For example, the publication application(s) 124 may provide a number of services and functions to users for listing goods and/or services for sale, facilitating transactions, and reviewing and providing feedback about transactions and associated users. Other publication applications (not shown) within or external to the network-based publisher 102 may provide statistics for players, names of parties associated with ballpark seating, traffic information, map information, store locations, and the like.
[0038] The third party application 116 may execute on a third party server (not shown) and may have programmatic access to the network-based publisher 102 via the programmatic interface provided by the API server 118. For example, the third party application 116 may use information retrieved from the network-based publisher 102 to support one or more features or functions on a website hosted by the third party. The third party website may, for example, provide one or more listing, feedback, publisher or payment functions that are supported by the relevant applications of the network-based publisher 102.
[0039] The network-based publisher 102 may provide a multitude of feedback, reputation, aggregation, and listing and price-setting mechanisms whereby a user may be a seller or buyer who lists or buys goods and/or services (e.g., for sale) published on the network-based publisher 102.
[0040] The application server 122 also includes a location-based application 130. The location-based application 130 communicates advertisements, some of which may offer incentives, to the mobile device 132 based on the mobile device 132 location and the purchase history or other indicated preferences of the user of the mobile device 132, as further described below.
[0041] FIG. 2 is a block diagram illustrating an example embodiment of a location-based incentive application 130, which is provided as part of the network-based publisher 102. The location-based incentive application 130 has a location identification module 202, an item identification module 204, and an incentive module 206. The location identification module 202 determines a geographic location of the mobile device 132. The item identification module 204 identifies an item, such as statistics for a player, or for a plurality of players, specified by the user at the geographic location of the mobile device 132. The incentive module 206 communicates an incentive from one or more local merchants based on the identified item and the geographic location of the mobile device 132. These modules may be implemented in hardware, firmware, or any combination thereof. As outlined above, database 128 may contain the purchase history of the user of the mobile device, as well as the aforementioned location.
[0042] In one embodiment, the location-based incentive application 130 receives a communication from the mobile device 132. For example, the communication may include a specification of an item and a location of the mobile device 132. Based on the specified item and the location of the mobile device 132, the incentive module 206 queries to retrieve results from the database server 126 and database 128 to determine and communicate incentives from local merchants to the mobile device 132.
[0043] FIG. 3 is a block diagram illustrating an example embodiment of the location identification module 202. The location of the mobile device 132 may be determined in many ways. For example, the mobile device 132 may be equipped with a GPS system that would allow the device to communicate the coordinates or location of the mobile device 132 to a GPS/triangulation module 302 of the location identification module 202. In another example, the location of the mobile device 132 may be determined by triangulation using wireless communication towers and/or wireless nodes (e.g., wi-fi hotspots) within wireless signal reach of the mobile device 132. Based on the geographic coordinates, the GPS/triangulation module 302 of the location identification module 202 may determine the geographic location of the mobile device 132 after consulting a mapping database (not shown). Furthermore, the general location of the mobile device 132 may be located when the user of the mobile device 132 logs onto a local internet connection, for example, at the above ballpark, at a hotel, at a coffee shop, or at any other organization based on location.
The Internet Protocol address of the network connection at the ballpark may be uniquely identified by the location of the ballpark.
[0044] The location identification module 202 may also include a location input module 306 configured to determine a geographic location of the mobile device 132 by requesting the user to input an address, city, zip code or other location information on his/her mobile device 132. In one embodiment, the user may select a location from a list of locations or a map on the mobile device 132. For example, a user on the mobile device 132 inputs the location of the mobile device 132 via an application or a web browser on the mobile device 132. In another embodiment, the location input module 306 derives the geographic location of the user by communicating with a third party application using respective APIs (Application Programming Interfaces).
[0045] The location identification module 202 may also include a location-dependent search term module 304. The location of the mobile device 132 may be inferred when the user of the mobile device 132 requests a search on the mobile device 132 using location-dependent search terms. For example, a user inputs a request, sometimes referred to herein as a "search query," on his/her mobile device for "Best Japanese Restaurant in my locale," "exit nearest my seat," "exit nearest where my car is parked," and the like. The location-dependent search term module 304 queries and retrieves results from a database (not shown) that may determine the geographic location of the ballpark, or the best Japanese restaurant near the user's location, perhaps based on opinions submitted by restaurant customers. Further, the user may request, or the user's history (such as on a social network) may show, a preference for a certain type of Japanese food. The system may then send directions to the closest Japanese restaurant that serves that preferred Japanese food, and not merely the closest Japanese restaurant. The location-dependent search term module 304 may have earlier received from the user the user's seating at the ballpark to use as needed for the above requests, or may have obtained the user's seating by having access to seats purchased by person, which information may also be used for determining the information to be supplied by the system responsive to the request.
[0046] The location identification module 202 may also include a tag module 308 configured to determine the geographic location of the mobile device 132 based on a tag associated with a unique geographic location. The tag may include, for example, a barcode tag (e.g., a linear barcode or a two-dimensional barcode) or a Radio Frequency Identification (RFID) tag that is associated with a unique geographic location. For example, the user of the mobile device 132 may use his/her mobile device 132 to scan the tag placed at a landmark or store. The tag may be uniquely associated with the geographic location of the landmark or store. Such a relationship may be stored in a database. The tag module 308 may then determine the geographic location of the mobile device 132 based on the tag after consulting the database.
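The sketch below shows one way the sources just described could be combined to resolve a device location, preferring the more precise signals; the tag identifiers, network address, and coordinates are placeholder assumptions.

```python
# Illustrative sketch: combine the location sources described above (GPS/triangulation,
# a scanned tag, the venue's network address, or user input) into one resolved location.

TAG_LOCATIONS = {"TAG-114": (42.3467, -71.0972)}       # tag id -> (lat, lon), placeholder
VENUE_NETWORKS = {"203.0.113.7": (42.3467, -71.0972)}  # venue IP -> coordinates, placeholder

def resolve_location(gps=None, tag_id=None, client_ip=None, user_input=None):
    """Prefer the most precise signal available: GPS/triangulation, then a scanned
    tag, then the venue's network address, then whatever the user typed in."""
    if gps is not None:
        return gps
    if tag_id in TAG_LOCATIONS:
        return TAG_LOCATIONS[tag_id]
    if client_ip in VENUE_NETWORKS:
        return VENUE_NETWORKS[client_ip]
    return user_input

print(resolve_location(tag_id="TAG-114"))   # -> (42.3467, -71.0972)
```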
[0047] FIG. 4 is a block diagram illustrating an example embodiment of an item identification module 204. The item specified by the user of the mobile device 132 may be determined in many ways using any of the following examples of modules: a text identification module 402, an audio identification module 404, a machine-readable symbol module 406, an image identification module 408, and a video identification module 410. The text identification module 402 may identify an item, such as statistics of a given player, or statistics of players at a given position, specified by the user at the mobile device 132, in one embodiment using a text input from the user at the mobile device 132. For example, the user may enter a request for a particular player or for a particular position the user wishes. The text identification module 402 may further identify the item by comparing the request for the statistics with a database containing the statistics by player. For example, the user may specify "statistics for David Ortiz" as a category for searching. The text identification module 402 may then identify statistics that correspond to the text that was input by the user. In this case, the text identification module 402 identifies statistics that match the player input by the user (e.g., David Ortiz).
[0048] The audio identification module 404 may identify an item or a category of the item as specified by the user at the mobile device using an audio input from the user at the mobile device. For example, the user may speak "statistics" and "David Ortiz" at the location of the mobile device. The audio identification module 404 may include a speech recognition system (not shown) that enables the spoken words of the user to be transcribed into text.
[0049] The audio identification module 404 may be used to identify the specified statistics by comparing the name of the player transcribed from the audio with a database containing player statistics.
[0050] The machine-readable symbol module 406 may identify an item by having the user scan an RFID chip or any other machine-readable symbol, such as a player's uniform number, at a distance, with his/her mobile device 132 as a machine-readable symbol reader. For example, the mobile device 132 may include an optical device (e.g., a lens) configured to capture an image of a player. The mobile device 132 may then upload the captured image to the machine-readable symbol module 406. The machine-readable symbol module 406 processes the captured image by querying and retrieving results from a database of machine-readable images to match the captured image of the player symbol with corresponding statistics. The machine-readable symbol module 406 may then identify the statistics specified by the user at the mobile device for rendering at the user device to augment the viewing or watching of the ball game in real time at the ballpark.
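Purely as an illustration, a text identification step along these lines might parse a free-text request into a structured query as in the sketch below; the parsing rule and the player index are simplified assumptions rather than the module's actual logic.

```python
# Illustrative sketch: turn a free-text request such as "statistics for David Ortiz"
# into a structured query against a player index. Rule and index are placeholders.
import re

PLAYER_INDEX = {"david ortiz": {"team": "Red Sox", "number": 34}}   # placeholder index

def parse_stats_request(text: str):
    """Extract the player name from a 'statistics for <player>' style request and
    resolve it against the player index."""
    m = re.match(r"statistics for (.+)", text.strip(), flags=re.IGNORECASE)
    if not m:
        return None
    player = PLAYER_INDEX.get(m.group(1).lower())
    return {"type": "player_stats", **player} if player else None

print(parse_stats_request("statistics for David Ortiz"))
# -> {'type': 'player_stats', 'team': 'Red Sox', 'number': 34}
```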
[0051] The image identification module 408 may identify an item by having the user take a picture of the item with his/her mobile device 132. Mobile devices commonly have an optical lens for capturing pictures. The mobile device 132 may then upload the picture, for example, of the player with certain recognizable features to the image identification module 408. The image identification module 408 analyzes the picture using an image recognition algorithm, such as a facial feature recognition algorithm for facial recognition or a numerical recognition algorithm for jersey number recognition (neither algorithm shown), to match the uploaded picture with a corresponding image of an item. The image recognition algorithm consults a database of images and corresponding statistics or other information items to identify the uploaded picture.
[0052] The video identification module 410 may be configured to identify an item by having the user take a video of the player with the user's mobile device. Mobile devices commonly have an optical lens to capture video. The mobile device 132 may then upload the video (or a portion of the video) to the video identification module 410. The video identification module 410 analyzes the frames of the video (for example, a jersey number or a player's facial image) using an image recognition algorithm (not shown) to match a frame of the video with a corresponding image of the player. The image recognition algorithm may query and retrieve results from a database of images and corresponding items to identify the uploaded video. For example, a user may take a video with his/her mobile device of a player walking on the field. The video identification module 410 recognizes the player and identifies the player's statistics, among other identifying and descriptive information about the player. In any of the above embodiments, the system may also return an advertisement to the user based on the user's purchase history if the system has access to that history. The system may also return an advertisement to the user based on the user's interest, such as an interest in a particular player. The system may also return an advertisement based on the location of the user device.
[0053] FIG. 5 is a block diagram illustrating an example embodiment of the incentive module 206 that may be used to execute the processes described herein. The incentive module 206 may include a local merchant module 502, an item module 504, an incentive match module 506, a preference module 508, an incentive receiver module 510, and a communication module 514.
[0054] The local merchant module 502 identifies at least one local merchant having at least one incentive based on the geographic location of the mobile device 132 as determined by the location identification module 202. A local merchant may be a merchant or retailer that is located within a predefined distance from the geographic location of the mobile device 132. In one embodiment, the local merchant module 502 identifies at least one local merchant with at least one incentive based on a search distance preference as specified in the preference module 508.
[0055] It should be noted that the incentive of the local merchant may or may not correspond to the item identified by the user. For example, a local merchant may feature a special sale on shoes while the identified item corresponds to a digital camera. Once all local merchants having incentives are identified based on the geographic location of the mobile device 132 (using a database of incentives), the incentive match module 506 may filter all local merchants based on the identified item, such as a favorite player indicated by the user sending a request for information with respect to that player. In the previous example, the local merchant featuring a sale on sports memorabilia for the player may be filtered out from the search result. In addition, the filtering can be based on the user's personal shopping history and/or preferences. For example, if it is known (in only one example) that the user's Facebook account indicates a number of "Likes" for hats, incentives for hats may be sent. If the information indicates a number of "Dislikes" for shoes, incentives for shoes may not be sent.
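A simplified sketch of this filtering, with invented merchant, distance, and preference data, might look like the following; it is not the incentive match module 506 itself.

```python
# Illustrative sketch: keep incentives from nearby merchants that match the identified
# item category and do not conflict with stated dislikes. All data are placeholders.

MERCHANT_INCENTIVES = [
    {"merchant": "Ballpark Sports Co.", "category": "athletic memorabilia",
     "distance_km": 0.4, "incentive": "15% off"},
    {"merchant": "Shoe Outlet", "category": "shoes",
     "distance_km": 0.3, "incentive": "buy one, get one"},
]

def matching_incentives(item_category: str, dislikes: set, max_km: float = 1.0) -> list:
    """Filter the incentive list by distance, item category, and user dislikes."""
    return [m for m in MERCHANT_INCENTIVES
            if m["distance_km"] <= max_km
            and m["category"] == item_category
            and m["category"] not in dislikes]

print(matching_incentives("athletic memorabilia", dislikes={"shoes"}))
```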
[0056] The item category module 504 determines the item, in the current example a player, specified by the user and identified by the item identification module 204. The item category module 504 determines that a category of the item related to the player specified by the user falls into the category of athletic memorabilia.
[0057] The incentive match module 506 determines whether the identified item category corresponds to a category in at least one incentive of at least one local merchant as determined by the local merchant module 502. For example, a user specifies David Ortiz with his/her mobile device. The item is identified as the player David Ortiz. The item identification module 204 generates an identity such as athletic memorabilia relating to David Ortiz. The local merchant module 502 identifies merchants with incentives local to the geographic location of the mobile device 132. The incentive match module 506 matches local merchants with incentives (sale or discount) on the specific memorabilia.
[0058] The communication module 514 communicates one or more incentives of the identified item from at least one local merchant to the mobile device 132. For example, a list of local merchants within a preset distance radius (e.g., one mile) of the mobile device 132 is displayed. The list of local merchants may include a sale or discount on the item identified by the user of the mobile device 132. The list may also include a list of recommended merchants (having an incentive on the identified item) that are located beyond the preset distance radius.
[0059] In another embodiment, the communication module 514 communicates one or more incentives of the identified category of the items from at least one local merchant to the mobile device 132. For example, a list of local merchants within a preset distance radius (e.g., a block) of the mobile device 132 is displayed. The list of local merchants may include a sale or discount on items similar or related to the identified item specified by the user of the mobile device 132. The list may also include a list of recommended merchants (having an incentive on similar items to the identified item) that are located beyond the preset distance radius.
[0060] FIG. 6 is a flowchart of a method 600, according to an embodiment, for augmenting personal, real-time, in-person viewing of an event at a location by a user. If an image of the environment, such as a ballpark and a ball game, and a request for an item for augmenting the reality of the user's real-time viewing, are received from the user device as at 602, 604 (in any order), the system may detect the requested item and access the item data as at 606 for use in augmenting the user's personal reality. This would include steps depending on the request. For example, the user may wish to use the user device to watch a baseball game at a ballpark with augmentation comprising statistics the user is most interested in, such as statistics for a particular player. In this case, the user device, for example an iPhone, might send the image of FIG. 7A with the image of player number 34, David Ortiz, from the game the user is watching. The system may include a video identification system 410, such as discussed above with respect to FIG. 4, that recognizes images of the players, for example, by the player's team uniform number, so that the system may correlate team, uniform number, and player in order to transmit the correct statistics for viewing if so requested. In this case the system may detect from the uniform number (and team, as appropriate) that the player is David Ortiz, and may access the desired statistics for that player. The system may present the requested statistics as augmentation using the augmentation technology discussed above, with a result such as seen in FIG. 7B. The system may, if desired, point to the particular player by arrow when augmenting the reality of watching the game live by presenting the player's statistics.
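Tying the operations of method 600 together, the sketch below shows one possible end-to-end flow for operations 602-606; the stubbed recognizer, the statistics table, and the fixed scale value are placeholder assumptions rather than the claimed implementation.

```python
# Illustrative end-to-end sketch of method 600: receive the environment image and the
# item request, identify the item, access its data, and return a scaled overlay.

STATS_DB = {("Red Sox", 34): {"player": "David Ortiz", "avg": 0.300}}   # placeholder values

def recognize_player(image_data: bytes) -> dict:
    """Stand-in for the video/image identification modules (uniform-number,
    facial, or RFID recognition)."""
    return {"team": "Red Sox", "number": 34, "bbox": (400, 200, 80, 160)}

def augment(image_data: bytes, request: dict) -> dict:
    player = recognize_player(image_data)                          # 602: environment image received
    if request.get("type") != "player_stats":                      # 604: item request received
        return {"status": "unsupported"}
    stats = STATS_DB.get((player["team"], player["number"]), {})   # 606: access the item data
    scale = 0.25   # assumed scale derived from the environment dimensions
    return {"status": "ok", "item": stats, "anchor": player["bbox"], "scale": scale}

print(augment(b"<jpeg bytes>", {"type": "player_stats"}))
```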
[0061] In another example of using the method of FIG. 6, the requested item at 604 may be a request to show where a friend is seated. As discussed above, the system may use the GPS/triangulation technology 302 of FIG. 3 to locate the requested seating. As an alternative, the system may use seating information provided from the user, or from the friend's Facebook site, to locate the seating. The system may then, using the augmentation technology already discussed, or other such appropriate technology, augment the user's view of the real-time in-person viewing by placing an arrow, such as that seen in FIG. 9A, pointing to the seating on the display of the user device while the camera of the device is viewing the ballpark, as seen in FIG. 9B. The actual location of the seating may be determined by the system as discussed generally above, by using ballpark coordinates, or by other appropriate locating markers such as ballpark section numbers and seat numbers, illustrated in FIG. 8, which may already be stored in the system's database. While the ballpark illustrated in FIG. 8 shows only the seating of the ballpark, well-known technology may be employed to superimpose a grid of coordinates on the image of FIG. 8 and use the resultant map to locate the seating in order to implement the augmentation under discussion.
[0062] Also as discussed above with respect to the location-based applications 130, including the incentive module 206, the system may use the user's location to target advertisements to the user. Technology for implementing advertisement targeting is discussed in more detail in U.S. Patent Application Serial Number 13/050,769, filed March 17, 2011, and entitled "TARGETED INCENTIVE ACTIONS BASED ON LOCATION AND INTENT," particularly FIGS. 7A-7C and the text associated therewith. The foregoing application is owned by the assignee of the present patent and is hereby incorporated herein by reference in its entirety.
[0063] FIG. 10 shows a diagrammatic representation of a machine in the example form of a computer system 800 within which a set of instructions may be executed causing the machine to perform any one or more of the methodologies discussed herein. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
[0064] The example computer system 800 includes a processor 802 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 804 and a static memory 806, which communicate with each other via a bus 808. The computer system 800 may further include a video display unit 810 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 800 also includes an alphanumeric input device 812 (e.g., a keyboard), a user interface (UI) navigation device 814 (e.g., a mouse), a disk drive unit 816, a signal generation device 818 (e.g., a speaker) and a network interface device 820.
[0065] The disk drive unit 816 includes a machine-readable medium 822 on which is stored one or more sets of instructions and data structures (e.g., software 824) embodying or utilized by any one or more of the methodologies or functions described herein. The software 824 may also reside, completely or at least partially, within the main memory 804 and/or within the processor 802 during execution thereof by the computer system 800, the main memory 804 and the processor 802 also constituting machine-readable media.
[0066] The software 824 may further be transmitted or received over a network 826 via the network interface device 820 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).
[0067] While the machine-readable medium 822 is shown in an example embodiment to be a single medium, the term "machine-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "machine-readable medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure, or that is capable of storing, encoding or carrying data structures utilized by or associated with such a set of instructions. The term "machine-readable medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
[0068] The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure.
This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, the disclosed subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.
Claims:
Claims (36)
[1] 1. A method comprising: receiving, from a user device located at an environment at which a user is viewing an event both with and without the use of the user device, image data containing an image of the environment; receiving a request from the user device for an item of information that relates to the event, for placement into the image of the environment; responsive to receiving the request, accessing a database to retrieve the item of information; generating a scaled image of the item of information based on dimensions of the environment; and transmitting the scaled image for placement into the image of the environment on the user device to generate an augmented reality image.
[2] 2. The method of claim 1, further comprising determining the location where the scaled image is to be placed into the image of the environment.
[3] 3. The method of claim 1, further comprising determining the location of placement of the scaled image into the image of the environment.
[4] 4. The method of claim 1, further comprising determining a location in the environment using a marker located in the environment.
[5] 5. The method of claim 1, further comprising: determining an orientation of the placement of the scaled image; and orienting the scaled image to the determined orientation.
[6] 6. The method of claim 1 wherein the image of the environment comprises a video of the environment.
[7] 7. The method of claim 6, wherein the placement of the scaled image is being repeatedly performed for the video.
[8] 8. The method of claim 1, further comprising providing the augmented reality image to a display device of the user.
[9] 9. The method of claim 1, further comprising providing additional information with the item of information in conjunction with the augmented reality image.
[10] 10. The method of claim 9, wherein the additional information comprises the image of an arrow or other location indicator.
[11] 11. The method of claim 10 wherein the arrow or other location indicator includes further image information.
[12] 12. The method of claim 1 wherein the event comprises a sporting event played by players and the item of information comprises statistics relating to at least one of the players.
[13] 13. The method of claim 12 wherein the sporting event is a ball game played by teams of players in a ballpark and the item of information comprises one of the group consisting of statistics of the players, statistics of relevant teams, statistics by positions played by the players of the relevant teams, pitching rotation of pitchers playing for the relevant teams, current league standings of the relevant teams, and possible playoff scenarios for the relevant teams.
[14] 14. The method of claim 1 wherein the image of the environment includes recognizable objects, and identity of at least one of the objects is recognized by one of the group consisting of RFID detection, GPS, triangulation, and facial recognition.
[15] 15. The method of claim 12 wherein the sporting event includes using a ball that includes an RFID chip, and RFID detection enables display of a trajectory of the ball.
[16] 16. The method of claim 12 wherein the item of information comprises one of the group consisting of the location in the ballpark of: a seat, a person, a selected vendor store, a desired facility, a desired path to a parking area, and a route from the ballpark to a desired location separate from the location of the environment.
[17] 17.
The method of claim 1, further including targeting an advertisement to the user device based on the user's purchase history, interest, or location.
[18] 18. Computer-readable storage having embedded therein a set of instructions which, when executed by one or more processors of a computer, causes the computer to execute the following operations: receiving, from a user device located at an environment at which a user is viewing an event both with and without the use of the user device, image data containing an image of the environment; receiving a request from the user device for an item of information that relates to the event for placement into the image of the environment; responsive to receiving the request, accessing a database to retrieve the item of information; generating a scaled image of the item of information based on dimensions of the environment; and transmitting the scaled image for placement into the image of the environment on the user device to generate an augmented reality image.
[19] 19. The computer storage of claim 18, the operations further including determining the location where the scaled image is to be placed into the image of the environment.
[20] 20. The computer storage of claim 18, the operations further including determining the location of placement of the scaled image into the image of the environment.
[21] 21. The computer storage of claim 18, the operations further including determining a location in the environment using a marker located in the environment.
[22] 22. The computer storage of claim 18, the operations further including: determining an orientation of the placement of the scaled image; and orienting the scaled image to the determined orientation.
[23] 23. The computer storage of claim 18 wherein the image of the environment comprises a video of the environment.
[24] 24. The computer storage of claim 23, the placement of the scaled image being repeatedly performed for the video.
[25] 25. The computer storage of claim 18, the operations further including providing the augmented reality image to a display device of the user.
[26] 26. The computer storage of claim 18, the operations further including providing additional information with the item of information in conjunction with the augmented reality image.
[27] 27. The computer storage of claim 26, the additional information comprising the image of an arrow or other location indicator.
[28] 28. The computer storage of claim 27 wherein the arrow or other location indicator includes further image information.
[29] 29. The computer storage of claim 18 wherein the event comprises a sporting event played by players and the item of information comprises statistics relating to at least one of the players.
[30] 30. The computer storage of claim 29 wherein the sporting event is a ball game played by teams of players in a ballpark and the item of information comprises one of the group consisting of statistics of the players, statistics of relevant teams, statistics of players by position played by players of the relevant teams, pitching rotation of pitchers playing for the relevant teams, current league standings of the relevant teams, and possible playoff scenarios for the relevant teams.
[31] 31.
The computer storage of claim 18 wherein the image of the environment includes recognizable objects, and identity of at least one of the objects is recognized by one of the group consisting of RFID detection, GPS, triangulation, and facial recognition.
[32] 32. The computer storage of claim 30 wherein the sporting event includes using a ball that includes an RFID chip, and RFID detection enables display of a trajectory of the ball.
[33] 33. The computer storage of claim 29 wherein the item of information comprises one of the group consisting of the location in the ballpark of: a seat, a person, a selected vendor store, a desired facility, a desired path to a parking area, and a route from the ballpark to a desired location separate from the location of the environment.
[34] 34. The computer storage of claim 18, the operations further including targeting an advertisement to the user device based on the user's purchase history, interest, or location.
[35] 35. A system comprising at least one processor and computer storage configured to execute: a receiving module to receive, from a user device located at an environment at which a user is viewing an event both with and without the use of the user device, image data containing an image of an environment, the receiving module also to receive a request from the user device for an item of information that relates to the event for placement into a location of the image of the environment; a database accessing module to, responsive to receipt of the request, access a database to retrieve the item of information; an image generating module to generate a scaled image of the item of information based on dimensions of the environment; and a transmitting module to transmit the scaled image for placement into the image of the environment on the user device to generate an augmented reality image.
[36] 36. The system of claim 35 further including a targeting module to target an advertisement to the user device based on the user's purchase history, interest, or location.
Similar technologies:
Publication number | Publication date | Title
US10614602B2 | 2020-04-07 | Personal augmented reality
DK2686820T3 | 2017-01-09 | Video Processing System to identify elements of video frames
US8868443B2 | 2014-10-21 | Targeted incentive actions based on location and intent
US20140012666A1 | 2014-01-09 | Transferring digital media rights in social network environment
AU2015271902B2 | 2017-09-07 | Personal augmented reality
US20130054370A1 | 2013-02-28 | System and method for communication based on location
KR20110106165A | 2011-09-28 | Apparatus and method for providing products information, and system and method for managing shopping mall
US10002370B2 | 2018-06-19 | Systems and methods for creating a navigable path between pages of a network platform based on linking database entries of the network platform
US11049176B1 | 2021-06-29 | Systems/methods for identifying products within audio-visual content and enabling seamless purchasing of such identified products by viewers/users of the audio-visual content
US10861037B1 | 2020-12-08 | System and method for incorporating cross platform metrics for increased user engagement
US20220044269A1 | 2022-02-10 | System and method for contactless sales using location-based services
Patent family:
Publication number | Publication date
AU2012362467B2 | 2015-09-17
CA2856869C | 2017-09-19
US10614602B2 | 2020-04-07
EP2798628A2 | 2014-11-05
US9530059B2 | 2016-12-27
WO2013101903A2 | 2013-07-04
EP2798628A4 | 2016-01-20
US9240059B2 | 2016-01-19
US20160171305A1 | 2016-06-16
WO2013101903A3 | 2014-06-12
US20170091975A1 | 2017-03-30
CA2856869A1 | 2013-07-04
US20200193668A1 | 2020-06-18
US20130170697A1 | 2013-07-04
Legal status:
2016-01-21 | FGA | Letters patent sealed or granted (standard patent)
2019-07-25 | MK14 | Patent ceased section 143(a) (annual fees not paid) or expired
Priority:
Application number | Publication number | Priority date | Filing date | Patent title
US13/340,141 | | 2011-12-29 | |
US13/340,141 | US9240059B2 | 2011-12-29 | 2011-12-29 | Personal augmented reality
PCT/US2012/071770 | WO2013101903A2 | 2011-12-29 | 2012-12-27 | Personal augmented reality
AU2015271902A | AU2015271902B2 | 2011-12-29 | 2015-12-17 | Personal augmented reality